# 235GB Pretraining

Jasmine 350M
JASMINE is a series of Arabic GPT models designed for few-shot learning, with parameter counts ranging from 300 million to 6.7 billion, pretrained on 235GB of text data.
Large Language Model · Transformers
UBC-NLP